
    Regular graphs maximize the variability of random neural networks

    In this work we study the dynamics of systems composed of numerous interacting elements interconnected through a random weighted directed graph, such as models of random neural networks. We develop an original theoretical approach based on a combination of a classical mean-field theory, originally developed in the context of dynamical spin-glass models, with the heterogeneous mean-field theory developed to study epidemic propagation on graphs. Our main result is that, surprisingly, increasing the variance of the in-degree distribution does not produce more variable dynamical behavior; on the contrary, the most variable behaviors are obtained in the regular-graph setting. We further study how the dynamical complexity of the attractors is influenced by the statistical properties of the in-degree distribution.
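
    As a rough illustration of the comparison this abstract describes, the sketch below simulates a random rate network (an assumed tanh model, not necessarily the paper's) on a regular versus a heterogeneous directed graph and compares the temporal variability of the activity; all parameter values (N, g, k_mean) are illustrative assumptions.

```python
# Minimal sketch (not the paper's exact setup): compare the dynamical
# variability of a random rate network on a regular vs. a heterogeneous
# directed graph. N, g, k_mean and the tanh model are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def in_degree_weights(N, in_degrees, g):
    """Random weight matrix whose row i has in_degrees[i] nonzero entries."""
    J = np.zeros((N, N))
    for i, k in enumerate(in_degrees):
        pre = rng.choice(N, size=k, replace=False)   # presynaptic neurons
        J[i, pre] = rng.normal(0.0, g / np.sqrt(k), size=k)
    return J

def simulate(J, T=200.0, dt=0.1):
    """Euler-integrate dx/dt = -x + J tanh(x) and return the trajectory."""
    N = J.shape[0]
    x = rng.normal(0.0, 1.0, N)
    xs = []
    for _ in range(int(T / dt)):
        x += dt * (-x + J @ np.tanh(x))
        xs.append(x.copy())
    return np.array(xs)

N, g, k_mean = 500, 2.0, 50
regular = np.full(N, k_mean)                      # zero in-degree variance
heterog = rng.poisson(k_mean, N).clip(min=1)      # broader in-degree distribution

for name, deg in [("regular", regular), ("heterogeneous", heterog)]:
    traj = simulate(in_degree_weights(N, deg, g))
    print(name, "temporal variance:", traj[500:].var(axis=0).mean())
```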

    Macroscopic equations governing noisy spiking neuronal populations

    At functional scales, cortical behavior results from the complex interplay of a large number of excitable cells operating in noisy environments. Such systems resist mathematical analysis, and computational neuroscience has largely relied on heuristic, partial (and partially justified) macroscopic models, which successfully reproduce a number of relevant phenomena. Relating these macroscopic models to the noisy spiking dynamics of the underlying cells has since been a major endeavor. Based on recent mean-field reductions for such spiking neurons, we present here a principled reduction of large biologically plausible neuronal networks to firing-rate models, providing a rigorous relationship between the macroscopic activity of populations of spiking neurons and popular macroscopic models, under a few assumptions (mainly linearity of the synapses). The reduced model we derive consists of simple, low-dimensional ordinary differential equations with parameters and nonlinearities derived from the underlying properties of the cells, and in particular the noise level. These simple reduced models are shown to accurately reproduce the dynamics of large networks in numerical simulations. Appropriate parameters and functions are made available online for different models of neurons: the McKean, FitzHugh-Nagumo and Hodgkin-Huxley models.
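
    A minimal sketch of the kind of firing-rate reduction meant here, assuming the generic form tau dr/dt = -r + F(J r + I). In the paper, F and the parameters would be derived from the spiking model and its noise level; here F is a placeholder sigmoid with an assumed noise scale sigma, and the coupling values are invented for the demo.

```python
# Illustrative sketch of a generic firing-rate reduction (not the authors'
# exact equations): tau * dr/dt = -r + F(J r + I). The gain F below is a
# placeholder; in the paper it is derived from the cells and the noise level.
import numpy as np

def F(u, sigma=1.0):
    # Noise-smoothed transfer function: larger sigma flattens the gain.
    return 1.0 / (1.0 + np.exp(-u / sigma))

def simulate(J, I, tau=10.0, sigma=1.0, T=500.0, dt=0.1):
    r = np.zeros(len(I))
    for _ in range(int(T / dt)):
        r += (dt / tau) * (-r + F(J @ r + I, sigma))
    return r

# Two populations, one excitatory and one inhibitory (assumed couplings).
J = np.array([[ 1.5, -1.0],
              [ 1.0, -0.5]])
I = np.array([0.5, 0.2])
print("steady-state rates:", simulate(J, I))
```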

    On an explicit representation of the solution of linear stochastic partial differential equations with delays

    Based on the analysis of a certain class of linear operators on a Banach space, we provide a closed-form expression for the solutions of certain linear partial differential equations with non-autonomous input, time delays and stochastic terms, which takes the form of an infinite series expansion.
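
    As a concrete scalar instance of such a series expansion (an illustration only, not the paper's general Banach-space result), consider the delayed linear SDE dX(t) = (a X(t) + b X(t - tau)) dt + sigma dW(t) with zero pre-history; its solution can be written through the "delayed exponential" fundamental solution:

```latex
% Scalar illustration (an assumption, not the paper's operator-valued setting):
% dX(t) = (a X(t) + b X(t-\tau))\,dt + \sigma\, dW(t), with X(0)=x_0, X(s)=0 for s<0.
\[
  X(t) \;=\; \varphi(t)\, x_0 \;+\; \sigma \int_0^t \varphi(t-s)\, \mathrm{d}W(s),
  \qquad
  \varphi(t) \;=\; \sum_{k=0}^{\lfloor t/\tau \rfloor}
      \frac{b^k\,(t-k\tau)^k}{k!}\; e^{a(t-k\tau)},
\]
% where \varphi is the fundamental ("delayed exponential") solution of the
% deterministic equation \varphi'(t) = a\,\varphi(t) + b\,\varphi(t-\tau),
% with \varphi(0)=1 and \varphi(t)=0 for t<0.
```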

    Detection of visual contours by a neural field model

    ISBN: 978-2-9532965-0-1. Neural field models are interesting tools for modeling the cortical areas of the brain at the mesoscopic scale. They possess the mathematical properties that make them tractable, namely existence, uniqueness and (conditional) stability of their solution. We describe a model of the primary visual cortex (V1) based on this formalism. The interconnection pattern of the neural masses that we choose to model V1 defines the functional role we wish to study: the detection of visual contours. This choice is guided by the desire to obtain a model that is biologically plausible and mathematically well posed. The interest of this approach is that it increases the coherence of the local information in an image (the orientation of local contrasts) using the information contained in the interconnections of the neural masses, while enriching the representation of the image (a discrete position + orientation map of the contours). This model helps us better understand biological visual perception, in particular the notion of "good continuation" and the detection of missing or excessively noisy contours.
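
    A toy discretization of such a neural field on a position x orientation lattice might look as follows; the lattice sizes, kernels, input and all parameter values are assumptions for illustration, not the model of V1 studied here.

```python
# Minimal sketch of a neural field for contour detection on a discrete
# position x orientation lattice. Sizes, kernels and the input below are
# illustrative assumptions, not the paper's model of V1.
import numpy as np

Nx, Nth = 16, 4                              # Nx*Nx positions, Nth orientations
th = np.pi * np.arange(Nth) / Nth
gx, gy, gt = np.meshgrid(np.arange(Nx), np.arange(Nx), np.arange(Nth),
                         indexing="ij")
X, Y, T = gx.ravel(), gy.ravel(), gt.ravel() # flattened unit coordinates
N = X.size

# Lateral weights: excitation between nearby, similarly oriented units
# ("good continuation"), plus uniform inhibition to keep activity bounded.
d2 = (X[:, None] - X[None, :])**2 + (Y[:, None] - Y[None, :])**2
dth = np.angle(np.exp(2j * (th[T][:, None] - th[T][None, :]))) / 2
W = np.exp(-d2 / 8.0) * np.exp(-dth**2 / 0.5) - 0.05

def S(u):                                    # sigmoidal firing-rate function
    return 1.0 / (1.0 + np.exp(-5 * (u - 0.5)))

# Noisy input: a horizontal "contour" of horizontally tuned units.
I = 0.1 * np.random.default_rng(1).random(N)
I[(Y == Nx // 2) & (T == 0)] = 1.0

u = np.zeros(N)
for _ in range(300):                         # Euler integration, dt/tau = 0.1
    u += 0.1 * (-u + W @ S(u) / N + I)
print("mean activity on / off the contour:",
      S(u)[(Y == Nx // 2) & (T == 0)].mean(), S(u)[Y != Nx // 2].mean())
```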

    Multiscale analysis of slow-fast neuronal learning models with noise

    This paper deals with the application of temporal averaging methods to recurrent networks of noisy neurons undergoing a slow and unsupervised modification of their connectivity matrix, called learning. Three time-scales arise in these models: (i) the fast neuronal dynamics, (ii) the intermediate time-scale of the external input to the system, and (iii) the slow learning mechanisms. Based on this time-scale separation, we apply an extension of the mathematical theory of stochastic averaging with periodic forcing to derive a reduced deterministic model for the connectivity dynamics. We focus on a class of models where the activity is linear, in order to understand the specificity of several learning rules (Hebbian, trace or anti-symmetric learning). In a weakly connected regime, we study the equilibrium connectivity, which gathers the entire "knowledge" of the network about the inputs, and we develop an asymptotic method to approximate this equilibrium. We show that the symmetric part of the connectivity after learning encodes the correlation structure of the inputs, whereas the anti-symmetric part corresponds to the cross-correlation between the inputs and their time derivative. Moreover, the ratio of time-scales appears as an important parameter revealing temporal correlations.
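
    The slow-fast structure can be sketched numerically as follows, using a trace-Hebbian rule (one of the rule classes mentioned above); all parameter values are assumptions, and the printed symmetric/anti-symmetric decomposition is only meant to mirror the result described in the abstract.

```python
# Toy sketch of the slow-fast setup (assumed parameters): fast noisy linear
# neurons driven by a periodic input, with slow trace-Hebbian learning of W.
# After learning, the symmetric part of W should reflect input correlations,
# the anti-symmetric part their cross-correlation with the time derivative.
import numpy as np

rng = np.random.default_rng(0)
n, dt, eps, sigma = 3, 0.01, 1e-3, 0.1       # eps = slow/fast time-scale ratio

def u(t):                                     # intermediate-scale periodic input
    return np.array([np.sin(t), np.sin(t + 1.0), 0.2 * np.cos(2 * t)])

x, xbar = np.zeros(n), np.zeros(n)            # xbar: slow trace of the activity
W = np.zeros((n, n))
t = 0.0
for _ in range(300_000):
    x += dt * (-x + W @ x + u(t)) + sigma * np.sqrt(dt) * rng.normal(size=n)
    xbar += dt * (x - xbar) / 0.5             # trace with time constant 0.5
    W += dt * eps * (np.outer(x, xbar) - W)   # slow trace-Hebbian rule
    t += dt

print("symmetric part:\n", (0.5 * (W + W.T)).round(3))
print("anti-symmetric part:\n", (0.5 * (W - W.T)).round(3))
```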

    Relative entropy minimizing noisy non-linear neural network to approximate stochastic processes

    A method is provided for designing and training noise-driven recurrent neural networks as models of stochastic processes. The method unifies and generalizes two previously separate modeling approaches, Echo State Networks (ESN) and Linear Inverse Modeling (LIM), under the common principle of relative entropy minimization. The power of the new method is demonstrated on a stochastic approximation of the El Niño phenomenon studied in climate research.
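
    In the linear-Gaussian special case (essentially LIM), relative-entropy minimization over one-step transitions reduces to a least-squares fit of the propagator together with matching of the residual covariance; the following sketch, run on synthetic data with an assumed ground-truth propagator, illustrates that special case only, not the full nonlinear method.

```python
# Minimal sketch of the linear-Gaussian special case (essentially LIM): for
# x[t+1] = A x[t] + noise, minimizing relative entropy against observed
# one-step transitions reduces to least squares for A plus matching the
# residual covariance. The data here are synthetic, for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observed" process (ground truth assumed for the demo).
A_true = np.array([[0.9, 0.2], [-0.1, 0.8]])
X = np.zeros((5000, 2))
for t in range(4999):
    X[t + 1] = A_true @ X[t] + 0.1 * rng.normal(size=2)

# Fit the propagator by least squares; estimate noise covariance from residuals.
X0, X1 = X[:-1], X[1:]
A_hat = np.linalg.lstsq(X0, X1, rcond=None)[0].T
resid = X1 - X0 @ A_hat.T
Q_hat = np.cov(resid.T)

print("estimated A:\n", A_hat.round(3))
print("estimated noise covariance:\n", Q_hat.round(4))
```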